AI can easily impersonate you. This trick helps thwart scammers

PCWorld

AI's rapidly expanding capabilities include convincing impersonations: audio and video that sound and look like you. Sometimes these deepfakes are harmless, part of a joke or meme involving a celebrity, politician, or other public figure. But as you might guess, scammers also use them to steal money from the unsuspecting. Most of the time, this style of scheme (often called a "grandparent scam") catches people off guard, because they don't realize how accessible and sophisticated the technology has become.


AI Scam Calls: How to Protect Yourself, How to Detect

WIRED

You answer a random call from a family member, and they breathlessly explain that there's been a horrible car accident. They need you to send money right now, or they'll go to jail. You can hear the desperation in their voice as they plead for an immediate cash transfer. While it sure sounds like them, and the call came from their number, you feel like something's off. So you decide to hang up and call them right back.


Should you have a family 'safe word' against AI voice-spoofing scams?

Washington Post - Technology News

For example, Woods said when his daughter was five years old, she would pretend to use a magic wand to turn him into a puppy. The question "What did you do with the wand?" could be a personal detail she would know but an impostor couldn't guess or figure out from online searches.